A restarted and modified simplex search for unconstrained optimization
Authors
Abstract
We propose in this paper a simple but efficient modification of the well-known Nelder-Mead (NM) simplex search method for unconstrained optimization. Instead of moving all n simplex vertices at once in the direction of the best vertex, our "shrink" step moves them in the same direction one by one until an improvement is obtained. In addition, to handle non-convex problems, we restart the modified NM (MNM) method by constructing an initial simplex around the solution obtained in the previous phase, repeating restarts until there is no improvement in the objective function value. Thus, our restarted modified NM (RMNM) is a deterministic descent method and may be seen as an extended local search for continuous optimization. To improve computational complexity and efficiency, we use a heap data structure for storing and updating the simplex vertices. Extensive empirical analysis shows that our modified method outperforms, on average, the original version as well as some other recent successful modifications, and that in solving global optimization problems it is comparable with state-of-the-art heuristics.
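As a rough illustration of the two ingredients, here is a minimal Python sketch of the one-by-one shrink and the restart driver. Everything in it (the names nm_modified and rmnm, the step sizes, the omission of NM's expansion and contraction moves, and the use of a sort in place of the paper's heap) is our simplification for readability, not the authors' implementation.

```python
import numpy as np

def nm_modified(f, x0, side=1.0, tol=1e-9, max_iter=5000):
    # Minimal NM variant with the one-by-one shrink; expansion and
    # contraction moves are omitted for brevity, and the paper's heap
    # bookkeeping is replaced by a per-iteration sort.
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    S = np.vstack([x0] + [x0 + side * e for e in np.eye(n)])  # n+1 vertices
    F = np.array([f(v) for v in S])
    for _ in range(max_iter):
        order = np.argsort(F)                # best ... worst
        S, F = S[order], F[order]
        if F[-1] - F[0] < tol:
            break
        c = S[:-1].mean(axis=0)              # centroid of all but the worst
        xr = 2.0 * c - S[-1]                 # reflect the worst vertex
        fr = f(xr)
        if fr < F[-2]:                       # reflection gives progress
            S[-1], F[-1] = xr, fr
            continue
        # modified "shrink": move vertices toward the best one, one at a
        # time from worst to best, stopping at the first improvement
        for i in range(n, 0, -1):
            S[i] = S[0] + 0.5 * (S[i] - S[0])
            F[i] = f(S[i])
            if F[i] < F[0]:
                break
    return S[0], F[0]

def rmnm(f, x0, side=1.0, restart_side=0.1):
    # Restart driver: rebuild a small simplex around the last solution
    # and rerun the modified search until the objective stops improving.
    x, fx = np.asarray(x0, dtype=float), None
    while True:
        x_new, f_new = nm_modified(f, x, side)
        if fx is not None and f_new >= fx:
            return x, fx
        x, fx, side = x_new, f_new, restart_side

# usage: minimize the 2-D Rosenbrock function
rosen = lambda x: (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2
x_star, f_star = rmnm(rosen, np.zeros(2))
```

The early exit in the shrink loop is what distinguishes this from the classical shrink, which always recomputes all n non-best vertices before the next iteration.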
Similar resources
Augmented Downhill Simplex a Modified Heuristic Optimization Method
Augmented Downhill Simplex Method (ADSM) is introduced here; it is a heuristic combination of the Downhill Simplex Method (DSM) with a random search algorithm. DSM is an interpretable nonlinear local optimization method, but as a pure exploitation algorithm it can become trapped in a local minimum. In contrast, random search performs global exploration, but is less efficient. Here, rand...
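The snippet is cut off above, but the exploration/exploitation split it describes can be sketched with a generic random-restart wrapper around SciPy's downhill simplex. This is only a stand-in for the idea, not ADSM itself; the function random_restart_nm, the bounds handling, and the restart count are our assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def random_restart_nm(f, bounds, n_restarts=20, seed=0):
    # Global exploration: uniform random starting points;
    # local exploitation: a Nelder-Mead run from each of them.
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    best = None
    for _ in range(n_restarts):
        x0 = rng.uniform(lo, hi)
        res = minimize(f, x0, method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return best

# usage: the multimodal 2-D Rastrigin function
rastrigin = lambda x: 20.0 + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))
print(random_restart_nm(rastrigin, [(-5.12, 5.12)] * 2).x)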
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
The modified BFGS method with new secant relation for unconstrained optimization problems
Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the second curvature of the objective function. Then, based on this modified secant relation, we present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, while the usual secant relation uses only gradient values. U...
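One classical relation of this kind (due to Zhang, Deng and Chen; the paper above may use a different variant) replaces the standard secant condition B_{k+1} s_k = y_k, where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, by

B_{k+1} s_k = \tilde{y}_k, \qquad \tilde{y}_k = y_k + \frac{\vartheta_k}{s_k^{\top} s_k}\, s_k, \qquad \vartheta_k = 6\,(f_k - f_{k+1}) + 3\,(g_k + g_{k+1})^{\top} s_k.

A Taylor expansion gives s_k^{\top} \tilde{y}_k = s_k^{\top} \nabla^2 f(x_{k+1})\, s_k + O(\|s_k\|^4), versus an O(\|s_k\|^3) error for s_k^{\top} y_k, which is precisely how function values buy extra curvature accuracy.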
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The globa...
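As a hint of how a three-term direction can enforce descent by construction (a generic device, not necessarily the construction of the paper above): starting from the LS parameter \beta_k^{LS} = g_{k+1}^{\top} y_k / (-g_k^{\top} d_k), one may take

d_{k+1} = -g_{k+1} + \beta_k^{LS} d_k - \beta_k^{LS} \frac{g_{k+1}^{\top} d_k}{\|g_{k+1}\|^2}\, g_{k+1},

so that g_{k+1}^{\top} d_{k+1} = -\|g_{k+1}\|^2 holds identically: the third term cancels the contribution of d_k along g_{k+1}, and the sufficient descent condition follows for any line search.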
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
Journal: Computers & OR
Volume: 36
Issue: -
Pages: -
Publication date: 2009